Past Event: Oden Institute Seminar
Joe Kileel, Assistant Professor, Mathematics, Oden Institute, UT Austin
3:30 – 5:00 PM
Thursday, May 11, 2023
POB 6.304 & Zoom
Given a constrained optimization problem, we often tackle it by choosing a parameterization of the domain and then optimizing over the parameters. For example, if we wish to optimize a real-valued cost over bounded-rank matrices, we can parameterize the domain using a low-rank factorization (e.g., the SVD) and then optimize over the factors. Similarly, in machine learning, when we optimize over the function space represented by a neural network, the space is parameterized by the network's weights and biases. In such situations, natural questions arise: does the choice of parameterization affect the nonconvex optimization landscape? And are some parameterizations better than others in terms of nonconvexity? In this talk, I will present new tools to help answer these general questions. The theory will be applied to several examples, including the ones above as well as others in tensor optimization and semidefinite programming. Joint work with Eitan Levin (Caltech) and Nicolas Boumal (EPFL).
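To make the low-rank example concrete, here is a minimal sketch (not taken from the talk; the cost function, problem data, step size, and iteration count are illustrative assumptions) of optimizing a smooth cost over matrices of rank at most r by parameterizing X = L R^T and running gradient descent on the factors:

import numpy as np

# Illustrative sketch (assumed setup): minimize f(X) = 0.5 * ||X - A||_F^2
# over matrices X of rank at most r by parameterizing X = L @ R.T
# and running gradient descent on the factor pair (L, R).

rng = np.random.default_rng(0)
m, n, r = 20, 15, 3
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank-r target

L = 0.1 * rng.standard_normal((m, r))   # factor parameters (small initialization)
R = 0.1 * rng.standard_normal((n, r))
step = 0.01

for _ in range(5000):
    G = L @ R.T - A                      # gradient of f at X = L @ R.T
    # chain rule through the parameterization: grad_L = G R, grad_R = G^T L
    L, R = L - step * (G @ R), R - step * (G.T @ L)

print("final cost:", 0.5 * np.linalg.norm(L @ R.T - A) ** 2)

The point of such a sketch is that the landscape being optimized is that of the factored cost g(L, R) = f(L R^T), not of f itself, which is exactly the kind of parameterized landscape the talk examines.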
Joe Kileel has been an assistant professor at UT Austin since 2020, based in the Oden Institute and the Department of Mathematics. Previously, he was a postdoc at Princeton University and a PhD student at UC Berkeley. His research interests include tensor/algebraic methods in data science, 3D reconstruction in imaging, and nonconvex optimization.